Independent Low-Rank Matrix Analysis Based on Generalized Kullback-Leibler Divergence


Similar Articles

Sparse Non-negative Matrix Factorization with Generalized Kullback-Leibler Divergence

Non-negative Matrix Factorization (NMF), especially with sparseness constraints, plays a critically important role in data engineering and machine learning. Hoyer (2004) presented an algorithm to compute NMF with exact sparseness constraints. The exact sparseness constraints depend on a projection operator. In the present work, we first give a very simple counterexample, for which the projecti...
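The snippet is cut off, but the sparseness measure behind Hoyer's exact constraint is compact enough to state. Below is a minimal sketch in Python; the function name hoyer_sparseness is ours, while the formula, sparseness(x) = (sqrt(n) - ||x||_1/||x||_2) / (sqrt(n) - 1), is the one defined in Hoyer (2004).

import numpy as np

def hoyer_sparseness(x):
    # Hoyer (2004) sparseness of a nonzero vector: 1 for a vector
    # with a single nonzero entry, 0 for a vector with all entries equal.
    x = np.asarray(x, dtype=float)
    n = x.size
    l1 = np.abs(x).sum()
    l2 = np.sqrt((x ** 2).sum())
    return (np.sqrt(n) - l1 / l2) / (np.sqrt(n) - 1)

For example, hoyer_sparseness([1, 0, 0, 0]) returns 1.0 and hoyer_sparseness([1, 1, 1, 1]) returns 0.0; Hoyer's projection operator maps a vector to the closest point achieving a prescribed value of this measure.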


Model Confidence Set Based on Kullback-Leibler Divergence Distance

Consider the problem of estimating a true density h(.) based upon a random sample X1, …, Xn. In general, h(.) is approximated using an appropriate (in some sense; see below) model fθ(x). This article, using Vuong's (1989) test along with a collection of k (> 2) non-nested models, constructs a set of appropriate models, called a model confidence set, for the unknown model h(.). Application of such confide...
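For context, the pairwise building block of such a construction, Vuong's (1989) statistic for two non-nested models f and g, can be stated as follows (a standard result, not taken from the truncated abstract):

\[
\mathrm{LR}_n = \sum_{i=1}^{n} \log \frac{f(X_i;\hat\theta_n)}{g(X_i;\hat\gamma_n)},
\qquad
\frac{\mathrm{LR}_n}{\sqrt{n}\,\hat\omega_n} \;\xrightarrow{d}\; \mathcal{N}(0,1)
\quad\text{under } H_0,
\]

where the normalizer is the sample standard deviation of the pointwise log-likelihood ratios, and H_0 says that f and g are equally close to the true density h(.) in Kullback-Leibler divergence.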


Kullback-Leibler Divergence for Nonnegative Matrix Factorization

The I-divergence, or unnormalized generalization of the Kullback-Leibler (KL) divergence, is commonly used in Nonnegative Matrix Factorization (NMF). This divergence has the drawback that its gradients with respect to the factorizing matrices depend heavily on the scales of the matrices, and learning the scales in gradient-descent optimization may require many iterations. This is often handled by expl...
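For reference, the divergence in question is

\[
D(\mathbf{V}\,\|\,\mathbf{WH}) = \sum_{ij}\Bigl( V_{ij}\log\frac{V_{ij}}{(\mathbf{WH})_{ij}} - V_{ij} + (\mathbf{WH})_{ij} \Bigr),
\]

and the scale indeterminacy is easy to see: W → cW, H → H/c leaves WH unchanged for any c > 0. The sketch below shows the classical Lee-Seung multiplicative updates that minimize this divergence (a standard algorithm, not code from the cited paper; the function name nmf_kl is ours):

import numpy as np

def nmf_kl(V, rank, n_iter=200, eps=1e-9, seed=0):
    # Lee-Seung multiplicative updates for D(V || WH).
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank)) + eps
    H = rng.random((rank, n)) + eps
    for _ in range(n_iter):
        H *= (W.T @ (V / (W @ H + eps))) / (W.sum(axis=0)[:, None] + eps)
        W *= ((V / (W @ H + eps)) @ H.T) / (H.sum(axis=1)[None, :] + eps)
    return W, H

Multiplicative updates sidestep the step-size problem of plain gradient descent, but the scale ambiguity between W and H remains, which is why explicit normalization strategies of the kind the abstract alludes to are common.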


Notes on Kullback-Leibler Divergence and Likelihood

The Kullback-Leibler (KL) divergence is a fundamental equation of information theory that quantifies the proximity of two probability distributions. Although it is difficult to understand by examining the equation alone, an intuition and understanding of the KL divergence arises from its intimate relationship with likelihood theory. We discuss how KL divergence arises from likelihood theory in an attempt t...
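The relationship the note builds on can be summarized in one display (a standard identity, not quoted from the truncated abstract): for X1, …, Xn drawn i.i.d. from a true density h, the average log-likelihood of a model fθ satisfies

\[
\frac{1}{n}\sum_{i=1}^{n}\log f_\theta(X_i)
\;\xrightarrow{\text{a.s.}}\;
\mathbb{E}_h\bigl[\log f_\theta(X)\bigr]
= -H(h) - D_{\mathrm{KL}}\bigl(h \,\|\, f_\theta\bigr),
\]

and since the entropy H(h) does not depend on θ, maximizing the likelihood is asymptotically the same as minimizing the KL divergence from the truth to the model.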


Computing the Kullback-Leibler Divergence between two Generalized Gamma Distributions

We derive a closed-form solution for the Kullback-Leibler divergence between two generalized gamma distributions. These notes are meant as a reference and provide a guided tour towards a result of practical interest that is rarely explicated in the literature.

1 The Generalized Gamma Distribution

The origins of the generalized gamma distribution can be traced back to the work of Amoroso in 1925 [1,...
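Under Stacy's parameterization f(x; a, d, p) = p x^(d-1) exp(-(x/a)^p) / (a^d Γ(d/p)), a closed form of this kind follows from the moment identities E[ln X] = ln a + ψ(d/p)/p and E[X^r] = a^r Γ((d+r)/p) / Γ(d/p). Here is a sketch in Python (the function name gg_kl is ours; the expression is a direct evaluation of the defining integral using these identities, not code from the cited notes):

import numpy as np
from scipy.special import gammaln, digamma

def gg_kl(a1, d1, p1, a2, d2, p2):
    # KL(f1 || f2) for generalized gamma densities
    # f(x; a, d, p) = p * x**(d - 1) * exp(-(x / a)**p) / (a**d * Gamma(d / p)).
    return (np.log(p1 / p2)
            + d2 * np.log(a2) - d1 * np.log(a1)
            + gammaln(d2 / p2) - gammaln(d1 / p1)
            + (d1 - d2) * (np.log(a1) + digamma(d1 / p1) / p1)
            + np.exp(gammaln((d1 + p2) / p1) - gammaln(d1 / p1)) * (a1 / a2) ** p2
            - d1 / p1)

Two quick sanity checks: gg_kl(a, d, p, a, d, p) returns 0 (the last two terms cancel, since Γ(z + 1)/Γ(z) = z), and with p1 = p2 = 1 the expression reduces to the familiar KL divergence between two ordinary gamma distributions.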



Journal

Journal title: IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences

Year: 2019

ISSN: 0916-8508 (print), 1745-1337 (online)

DOI: 10.1587/transfun.e102.a.458